Device for detecting drug influence based on visual data
Patent abstract:
An eye-scanner device (120) for detecting drug influence in a person based on a visual representation of an eye (110) of the person, the device comprising a vision-based sensor (121), a light source (122) and a control unit (125, 135), wherein the vision-based sensor (121) is arranged to capture a visual representation of the eye (110), the control unit (125, 135) is arranged to determine a pupil size metric, a pupil's reaction to light metric, a horizontal and/or vertical nystagmus metric, and an eye crossing ability metric, wherein the control unit (125, 135) is arranged to detect drug influence in the person based on the pupil size metric, the pupil's reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric.

Publication number: SE2030188A1
Application number: SE2030188
Filing date: 2020-06-08
Publication date: 2021-02-28
Inventors: Jenny Johansson; Stefanie Najafi
Applicant: Eyescanner Tech Sweden Ab
Patent description:
TITLE
DEVICE FOR DETECTING DRUG INFLUENCE BASED ON VISUAL DATA

TECHNICAL FIELD
The present disclosure relates to methods, uses, and portable devices for detecting drug influence in a person based on a visual representation of an eye of the person captured by one or more vision-based sensors.

BACKGROUND
It is sometimes desired to determine whether a person is under the influence of one or more drugs, e.g., in a traffic police control situation. A common way of determining if a person is under the influence of drugs is to manually observe the eyes of the person in order to detect signs of drug influence. Manual testing is, however, time consuming, and may not be entirely reliable since the test result is at least partly influenced by the technique and judgement ability of the person or persons carrying out the manual test. Different types of drugs also give different symptoms, which means that some experience is required by the person carrying out the test in order to reliably detect drug influence.

Some work has been done towards automating the procedure in order to increase the efficiency and reliability of drug influence detection:

US 2018/0333092 A1 discloses an ocular response testing device for testing eyes of a patient to screen for driving under the influence of drugs or alcohol.

US 9,357,918 B1 discloses a system and method for drug screening and monitoring pupil reactivity and voluntary and involuntary eye muscle function.

Automating the test procedure may provide more reliable test results obtained in a more efficient manner. There is, however, a continuing need for more robust ways to automatically determine if a person is under the influence of drugs. Also, the necessary equipment for carrying out the automated tests should be kept at a minimum.

SUMMARY
It is an object of the present disclosure to provide an improved device for detecting drug influence in a person.

This object is at least in part obtained by an eye-scanner device for detecting drug influence in a person based on a visual representation of an eye of the person. The device comprises a vision-based sensor, a light source, and a control unit. The vision-based sensor is arranged to capture a visual representation of the eye. The control unit is arranged to determine a pupil size metric, a pupil's reaction to light metric, a horizontal and/or vertical nystagmus metric, and an eye crossing ability metric. The control unit is arranged to detect drug influence in the person based on the pupil size metric, the pupil's reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric.

The disclosed device measures four key metrics on which the drug detection test is based, which provides for a reliable detection. The proposed device does not require contact with the person's face, which is an advantage. The test procedure provides reliable test results obtained in an efficient manner and in a robust way, while keeping the necessary equipment for carrying out the automated tests at a minimum. The device is portable and provides for a convenient drug testing device which can be incorporated in existing smartphones, which is an advantage.

According to aspects, the vision-based sensor comprises a camera sensor arranged on a smartphone, tablet, or a mobile computing device. This way, bulky vision-based sensor equipment may be omitted, which provides a more convenient system.

According to aspects, the light source is comprised in a smartphone, tablet, or a mobile computing device.
This way, additional light source equipment may be omitted, which provides a more convenient system.

According to aspects, the control unit is arranged in physical connection to the vision-based sensor. This way, the detection of drug influence in the person can be done directly in the eye-scanner device.

According to aspects, the control unit is arranged distanced from the vision-based sensor in an external processing resource. This way, the vision-based sensor can upload captured image data of the eye to the external processing resource, preferably via a wireless link. This allows for more processing power to be utilized.

According to aspects, drug influence in the person is detected if one or more of the metrics fail to meet a respective metric pass criterion. This provides a quick and efficient way of detecting drug influence in the person.

According to aspects, drug influence in the person is detected based on a weighted combination of the metrics. This provides a more advanced way of detecting drug influence in the person.

According to aspects, the vision-based sensor is arranged to capture a time sequence of images of the eye and/or a video sequence of the eye. The images and/or video sequence is then subject to image processing in order to determine the four different measurement metrics discussed above.

According to aspects, the control unit is arranged to gather one or more statistics associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics. This enables analysis of drug influence. The data points may also be associated with respective time stamps and/or geographical positions. The geographical positions may be quantized on, e.g., city or area level in order not to impact anonymization.

According to aspects, the eye-scanner device comprises a display with a visible marker. The display can then be used to provide instructions to the test subject for, e.g., moving the gaze or directing the gaze at a given location on the screen.

According to aspects, the eye-scanner device comprises a distance sensor arranged to determine the distance between the vision-based sensor and the eye of the person. This can facilitate obtaining the size measurements and movement measurements, discussed above, of the four tests in terms of absolute values.

According to aspects, the distance sensor comprises a camera sensor arranged on a smartphone, tablet, or a mobile computing device. This provides a convenient system.

According to aspects, the distance sensor is arranged to be calibrated using a known reference in proximity to the eye. This can improve the accuracy of the determined distance.

According to aspects, the distance sensor comprises a plurality of camera sensors. The distance can thus be determined from techniques involving, i.a., distances between camera lenses (i.e. sensors), field of view angles, and geometric derivations.

According to aspects, the distance sensor comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection. This can improve the accuracy of the determined distance.

According to aspects, the display comprises one or more test indicators arranged to indicate that a capture of a visual representation of the eye is ongoing. This can guide the person under test to keep looking at the device until the capture is complete.
According to aspects, the display comprises one or more distance indicators arranged to indicate that the distance between the vision-based sensor and the eye of the person is within a preferred range. This provides clear feedback to the person under test, which enables the preferred distance to be maintained.

According to aspects, the display comprises one or more capture indicators arranged to indicate that the vision-based sensor is capable of capturing the visual representation of the eye. The capture indicators can thereby guide the person under test to maintain capturing capabilities.

According to aspects, the eye-scanner comprises a speaker device arranged to communicate instructions to the person. This way, the person can be guided while not directly looking at the display, which is an advantage.

According to aspects, the pupil size metric, the pupil's reaction to light metric, the horizontal and/or vertical nystagmus metric, and the eye crossing ability metric are, at least partly, determined based on relative measurements. The relative measurements may eliminate the need to determine the absolute distance between the vision-based sensor and the eye of the person, or at least reduce the required accuracy of the distance, depending on the desired accuracy of the final measurement metrics of the four tests.

There are furthermore disclosed herein control units, computer programs, computer readable media, computer program products, and uses associated with the above discussed advantages.

The methods disclosed herein are associated with the same advantages as discussed above in connection to the eye-scanner device.

Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the element, apparatus, component, means, step, etc." are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated. Further features of, and advantages with, the present invention will become apparent when studying the appended claims and the following description. The skilled person realizes that different features of the present invention may be combined to create embodiments other than those described in the following, without departing from the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS
Aspects of the present disclosure will now be described more fully with reference to the accompanying drawings, where
Figures 1-2 schematically illustrate example eye-scanner systems;
Figures 3A, 3B schematically illustrate eye measurement metrics;
Figure 4 is a flow chart illustrating details of example methods;
Figure 5 schematically illustrates a processing device; and
Figure 6 schematically illustrates a computer program product.

DETAILED DESCRIPTION
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain aspects of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments and aspects set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Like numbers refer to like elements throughout the description.

It is to be understood that the present invention is not limited to the embodiments described herein and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.

There is disclosed herein a digital tool referred to as an Eye-scanner, which, via the camera of a smartphone or other vision-based sensor device, automatically detects drug-affected persons. The disclosed methods and devices allow for a more efficient and reliable detection of drug influence in a person. Compared to known devices for detecting drug influence, the proposed device does not require contact with the person's face, such as the ski goggle-like frame described in US 9,357,918 B1.

The Eye-scanner is arranged to, within a few minutes, measure four different measurement values or metrics by eye tracking and pupil observation: pupil size, the pupil's reaction to light, control of horizontal and/or vertical nystagmus, and the ability of the person to cross eyes. Thus, the disclosed device measures four key metrics on which the drug detection test is based, which provides for a more reliable detection result compared to prior art based on measurement of fewer than four metrics.

Figure 1 shows a system 100 for detecting drug influence in a person, which is based on an eye-scanner device 120. The system comprises a vision-based sensor 121 for capturing visual representations such as an image, a time sequence of images, and/or a video sequence of an eye 110 of the person subject to drug influence testing. The system 100 also comprises a light source 122 and a control unit 125 and/or external processing resources 130 for detecting drug influence in a person by image processing of the captured visual representation. The external processing resources may comprise data storage for storing anonymized test results and statistics.

Figure 2 shows an embodiment 200 of the proposed system where the vision-based sensor 121, light source 122, and control unit 125 are comprised in a smartphone 210, tablet, or the like. This system is portable and provides for a convenient drug testing device which can be incorporated in existing smartphones, which is an advantage.

The light source 122 may be a photography light source normally found on smartphones, which can be used to stimulate the eye by turning the light source on and off. Such light sources may also be arranged with a controllable light intensity.

According to some aspects, the vision-based sensor 121 is a stereo vision sensor comprising more than one vision sensing element, e.g., two or more camera lenses. The vision-based sensor can then be arranged to determine a distance from the sensor to the pupil or pupils, and thereby estimate pupil size.

In general, detecting the location and orientation of a pupil in an eye is known and will therefore not be discussed in more detail herein. With reference to Figures 3A and 3B, the pupil can be measured in terms of size 310, and its horizontal 330 and vertical 340 motion can be tracked over a time sequence of images or through a video sequence.

The eye-scanner 120 and/or the external processing resource 130 comprise control units 125, 135 configured to perform image processing, and in particular to locate the pupil in a captured image or sequence of images. This hardware will be discussed in more detail below in connection to Figure 5.
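Purely as an illustration, the sketch below shows one way such pupil localization could be implemented, assuming OpenCV 4.x in Python and a close-up frame in which the pupil is the darkest roughly circular region; the threshold value, the circularity check, and the function name find_pupil are assumptions made for this example and not the specific image processing used by the disclosed device.

```python
# Illustrative sketch only: locate the pupil as the largest dark, roughly
# circular blob in a close-up eye image and report its centre and diameter
# in pixels. Assumes OpenCV 4.x; all thresholds are placeholder values.
import cv2
import numpy as np

def find_pupil(frame_bgr, dark_threshold=50, min_circularity=0.6):
    """Return (centre_x, centre_y, diameter) in pixels, or None if no pupil-like blob is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Keep only the darkest pixels; the pupil is assumed to be among them.
    _, mask = cv2.threshold(gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)            # largest dark region
    (cx, cy), radius = cv2.minEnclosingCircle(blob)
    circularity = cv2.contourArea(blob) / (np.pi * radius ** 2 + 1e-9)
    if circularity < min_circularity:                    # reject lashes, shadows, etc.
        return None
    return float(cx), float(cy), 2.0 * float(radius)
```

Running such a detector on every frame of a captured sequence yields the pupil-centre and pupil-diameter time series from which the four metrics described below can be computed.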
The four different measurement metrics on which the drug detection test is based will now be described in detail.

Measurement metric 1 - pupil size

Figure 3A shows an eye 300 where a pupil size measure 310 is indicated. The pupil size test checks if the pupil size is within the range of 2.0-8.0 mm, and preferably in the range 3.0-6.5 mm. According to some aspects, if the person under test falls outside the normal range, a positive result is generated, otherwise a negative result is generated.

According to some aspects, the pupil size is used to derive a first weighting value which is based on how far outside the range the pupils of the person under test are. Thus, if the pupil size is well within the normal range then a relatively low value is output, which value then increases as the pupil size comes closer to the boundary of the normal range. The value then continues to increase as the pupil size goes beyond the normal range. This way the 'normality' of a pupil size can be quantified more finely, e.g., on a scale from 1 to 100, compared to a binary normal/abnormal quantification.

Measurement metric 2 - measurement of the pupil's reaction to light

The test measures whether the pupil responds to light. Light of a given intensity is first directed into the eye. The pupil is then monitored for a response to the light. Slow or no reaction to light generates a positive result. What constitutes a "slow" reaction can be pre-configured. This metric may also be represented as a second weighting value, e.g., on a scale from 0 to 100. A very fast reaction to light then gives a low value, e.g., below 10, while a slower reaction results in a higher value. Non-existent reactions would give a result on the order of 100. Again, the reaction to light can then be quantified more finely compared to the binary slow/fast. According to aspects, the ambient light is taken into consideration when determining the value of the response metric. For instance, a less pronounced reaction can be expected to light of a given intensity if the eye has previously been subjected to strong daylight for an extended period of time, compared to an eye which has not been subject to strong ambient light, e.g., at nighttime. The light source used for testing, i.e., the light of the given intensity, can be modulated in terms of this intensity in dependence on the intensity of the ambient light.

Measurement metric 3 - control of horizontal and/or vertical nystagmus

Nystagmus is a condition of involuntary eye movement. Due to the involuntary movement of the eye, it has also sometimes been called "dancing eyes". Figure 3A illustrates vertical 340 and horizontal 330 eye movement. The test checks whether involuntary twitching occurs when the person is looking from side to side (horizontal) or up and then down (vertical). Horizontal and/or vertical jerking of the pupil, i.e., if the pupil does not move linearly from end point to end point, generates a positive result. Again, the twitching can be quantized on a finer scale from 0 to 100.

The eye-scanner 120 may comprise a display 220 configured to show a marker which the test subject can be instructed to follow with the eye 110. The marker can then move from side to side or up and down, while the pupil motion is monitored for any non-smooth motion indicating a level of nystagmus.

Measurement metric 4 - checking the ability to cross the eyes

Crossing one's eyes means to direct the left and right eye gaze towards a point located close to the eyes and centered between the eyes, as illustrated in Figure 3B.
Eye crossing occurs, e.g., when trying to follow an object with both eyes as the object approaches the nose of the person. The test checks the person's ability to cross his or her eyes when following an object with the eyes. Inability to cross the eyes generates a positive outcome.

The display 220 can be used to assist in measuring this metric. Similar to the nystagmus metric, a marker can be shown on the display at a location where eye crossing is expected in a person not influenced by drugs. The movement of the pupils can then be monitored in order to discern if the test subject is capable of crossing the eyes or not. Again, the relative location of the pupils can be detected based on image processing of the captured visual representation.

When the four tests have been performed, a merging and analysis of the measured values takes place. This provides a recommendation for further management, e.g., additional drug tests such as blood or urine tests.

According to aspects, a preliminary analysis is carried out after each of the four tests to determine if the test was successful, i.e. to investigate if enough data has been collected for the analysis discussed above. If, e.g., the view of the vision-based sensor is obstructed during a test, the test is deemed unsuccessful and a re-testing is prompted.

According to one example, a detection of drug influence and a potential recommendation for further testing is triggered in case one or more of the four tests are failed, i.e., if one or more measured metrics do not meet respective pre-determined pass criteria. Alternatively, a combination of the above weighting values can be used as a base for detecting drug influence. For instance, four weighting values can be added together to represent drug influence likelihood on a scale from 0 to 400. A sketch of one such scoring scheme is given below.

According to some aspects, the results are anonymized and stored on a remote server such as the external processing resource 130. This is in order to later be able to use the statistics the results generate.

To summarize, Figures 1 and 2 schematically illustrate example eye-scanner devices 120 for detecting drug influence in a person based on a visual representation of an eye 110 of the person. The device comprises a vision-based sensor 121, a light source 122 and a control unit 125, 135. The vision-based sensor 121 is arranged to capture a visual representation of the eye 110. The control unit 125, 135 is arranged to process the visual representation in order to determine the four metrics discussed above: a pupil size metric 310, a pupil's reaction to light metric 320, a horizontal and/or vertical nystagmus metric 330, 340, and an eye crossing ability metric 350. The control unit 125, 135 is arranged to detect drug influence in the person based on the pupil size metric 310, the pupil's reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350.

According to some aspects, the vision-based sensor 121 comprises a camera sensor arranged on a smartphone 210, tablet, or a mobile computing device. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques. The vision-based sensor 121 may be arranged to capture a time sequence of images of the eye 110, and/or a video sequence of the eye 110. The images and/or video sequence is then subject to image processing in order to determine the four different measurement metrics discussed above.
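The text leaves the exact form of the weighting values open. The following minimal sketch shows one way the finer-grained scores and the 0-400 combination described above could look: the pupil-size ranges (3.0-6.5 mm preferred, 2.0-8.0 mm outer) are taken from the text, while the latency scaling of the reaction-to-light score, the 200-point decision threshold, and the function names are assumptions made for illustration.

```python
# Illustrative sketch of per-metric "abnormality" scores (0-100) and their
# combination on the 0-400 scale mentioned above. All numbers not given in
# the text are placeholder assumptions.

def pupil_size_score(size_mm, normal=(3.0, 6.5), outer=(2.0, 8.0)):
    """Low well inside the preferred range, rising towards and beyond its edges."""
    lo, hi = normal
    if lo <= size_mm <= hi:
        margin = min(size_mm - lo, hi - size_mm) / ((hi - lo) / 2.0)
        return round(25 * (1.0 - margin))               # 0 at the centre, 25 at the edge
    overshoot = (lo - size_mm) if size_mm < lo else (size_mm - hi)
    span = (lo - outer[0]) if size_mm < lo else (outer[1] - hi)
    return round(min(100, 25 + 75 * overshoot / span))

def light_reaction_score(constriction_latency_s, fast_s=0.3, none_s=2.0):
    """Fast constriction after the light stimulus scores low; no reaction scores ~100."""
    if constriction_latency_s is None:                   # no constriction detected at all
        return 100
    frac = (constriction_latency_s - fast_s) / (none_s - fast_s)
    return round(100 * min(1.0, max(0.0, frac)))

def combined_likelihood(scores, threshold=200):
    """Sum of the four 0-100 scores on a 0-400 scale, plus a simple detection flag."""
    total = sum(scores)
    return total, total >= threshold
```

For example, combined_likelihood([10, 40, 80, 95]) returns (225, True) under these placeholder settings; the pass/fail variant described above would instead compare each metric directly to its own criterion.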
The light source 122 may also be comprised in the smartphone 210, tablet, or mobile computing device.

As mentioned, the device 120 does not require contact with the person's face during the measurements. Therefore, the eye-scanner device 120 may comprise a distance sensor 126 arranged to determine the distance between the vision-based sensor 121 and the eye 110 of the person. By using the determined distance, the control unit can carry out the analysis on measured values captured at a range of different distances, e.g. between 0 and 2 meters. Preferably, however, the distance is in a preferred range of 10 to 50 centimeters.

According to some aspects, the distance sensor 126 comprises a camera sensor arranged on a smartphone 210, tablet, or a mobile computing device. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques. The distance may be determined, e.g., using reference sizes or estimated reference sizes. In an example embodiment, the distance sensor 126 is arranged to be calibrated using a known reference in proximity to the eye 110 of the person under test before or after the tests. The known reference may, e.g., be a paper with a checkerboard with known dimensions, or something similar.

The distance sensor 126 may comprise a plurality of camera sensors, e.g. as in a stereoscopic camera. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques. The distance can thus be determined from techniques involving, i.a., distances between camera lenses (i.e. sensors), field of view angles, and geometric derivations.

According to some aspects, the distance sensor 126 comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection. Thus, existing smart devices such as mobile smartphones and the like can be used to implement the herein proposed techniques.

The distance sensor 126 may comprise an ultra-sonic ranging device. Three-dimensional ultrasound technology for human-machine interaction is an emerging technology which may be an integral part of future smartphones and the like.

The size measurements and movement measurements, discussed above, of the four tests may be measured in terms of absolute values. Alternatively, or in combination, these measurements may be made in relative terms. In other words, the pupil size metric 310, the pupil's reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350 are, at least partly, determined based on relative measurements. For example, the size of the pupil (e.g. its diameter) may be measured relative to the height of the whole eye or the size of the iris (e.g. its diameter). Other normalizations are also possible. The relative measurements may eliminate the need to determine the absolute distance between the vision-based sensor 121 and the eye 110 of the person, or at least reduce the required accuracy of the distance, depending on the desired accuracy of the final measurement metrics of the four tests. A sketch of both scaling approaches is given further below.

The eye-scanner device 120 optionally also comprises a display 220 with a visible marker or the like. The display can then be used to provide instructions to the test subject for, e.g., moving the gaze or directing the gaze at a given location on the screen.
For instance, when determining the eye-crossing ability metric, the test person may be directed to look at a marker on the screen, which marker is positioned so as to induce eye crossing. The display may also be configured to show markers moving from right to left and/or up and down in order to stimulate horizontal or vertical pupil movement during a nystagmus test. The display 220 may also display the sequence of images and/or video sequence of the eye 110 in a live feed.

The display may comprise one or more test indicators arranged to indicate that a capture of a visual representation of the eye 110 (i.e. a test) is ongoing. The indicators can, for example, be circles displayed on top of the pupils in the live feed. These circles can track the pupils on the display continuously during the capture. The circles may only be present during one of the tests. Alternatively, the circles are present at all times during the capture and change shape or color during one of the tests. This can guide the person under test to keep looking at the device until the capture is complete.

The display may comprise one or more distance indicators arranged to indicate that the distance between the vision-based sensor 121 and the eye 110 of the person is within a preferred range. If the distance is larger than preferred, the indicators may change shape or color. If the distance is smaller than preferred, the indicators may change in a different way. The distance indicators may also be circles displayed on top of the pupils. As an example, the circles are green when the distance is in the preferred range, yellow when the distance is too short, and red when the distance is too long. The distance indicators can thereby guide the person under test to maintain the distance within the preferred range.

The display may comprise one or more capture indicators arranged to indicate that the vision-based sensor 121 is capable of capturing the visual representation of the eye 110, which means, i.a., that the eye is in the field of view of the sensor and that the eye is in focus. The capture indicators can thereby guide the person under test to maintain capturing capabilities. The capture indicators may also be circles displayed on top of the pupils, which appear on the display when the vision-based sensor 121 is capable of capturing the visual representation of the eye 110.

According to aspects, the eye-scanner device 120 comprises a speaker device 127 arranged to communicate instructions to the person. The speaker device 127 may be arranged on a smartphone 210, tablet, or a mobile computing device. The instruction could be, e.g., "look into the camera", "the light source 122 is about to be turned on, keep looking straight ahead", and "without moving your head, start looking to the left, continue moving your eyes until commanded to stop". This way, the person can be guided while not directly looking at the display, which is an advantage.

The control unit 125 may be arranged in physical connection to the vision-based sensor 121, e.g., as part of the processing circuitry of a smartphone 210. However, the control unit 135 may also be arranged distanced from the vision-based sensor in an external processing resource 130, such as a remote server or cloud-based processing resource. The vision-based sensor then uploads captured image data of the eye 110 to the external processing resource, preferably via a wireless link 140.
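As a sketch of the two scaling options discussed above (an absolute scale derived from a known reference imaged next to the eye, or distance-independent relative measurements), the helpers below convert a pixel measurement to millimetres and, alternatively, form a pupil-to-iris ratio. The function names are assumptions made for illustration, and the nominal iris diameter of roughly 11.5-12 mm is a typical anatomical value rather than a figure from the text.

```python
# Illustrative helpers for the two scaling strategies: an absolute scale from a
# reference object of known size (e.g. one checkerboard square) imaged next to
# the eye, and a relative, distance-independent pupil-to-iris ratio.

def mm_per_pixel(reference_size_mm, reference_size_px):
    """Scale factor derived from a known-size reference in roughly the same image plane as the eye."""
    return reference_size_mm / reference_size_px

def pupil_diameter_mm(pupil_diameter_px, scale_mm_per_px):
    """Absolute pupil diameter once the scale factor is known."""
    return pupil_diameter_px * scale_mm_per_px

def relative_pupil_size(pupil_diameter_px, iris_diameter_px):
    """Distance-independent normalization; multiplying by a nominal iris diameter
    of roughly 11.5-12 mm would give an approximate absolute size if needed."""
    return pupil_diameter_px / iris_diameter_px
```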
The drug detection test is based on the four metrics discussed above. However, the test procedure can be configured in a number of different ways. For instance, each metric can be associated with a pass criterion. For instance, the pupil size metric can be associated with a range of pupil sizes. If the pupil size 310 of the person under test is not within the acceptable range, then the test is failed, otherwise it is passed. Similarly, the pupil's reaction to light metric 320, the horizontal and/or vertical nystagmus metric 330, 340, and the eye crossing ability metric 350 may be associated with respective pass criteria. The person is detected to be under likely influence of drugs if one or more of the metrics fail to meet the respective metric pass criterion.

The drug detection test can also be more advanced, e.g., comprising a weighted combination of the four metrics. In this case, if a person only barely passes the four different tests, a likely drug influence may still be detected based on a combination of the four test metrics.

The systems and devices disclosed herein can also be used for gathering statistics on, e.g., drug influence. For instance, the control unit 125, 135 may optionally be arranged to gather one or more statistics or data points associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics. This enables analysis of drug influence. The data points may also be associated with respective time stamps and/or geographical positions. The geographical positions may be quantized on, e.g., city or area level in order not to impact anonymization.

The control unit 125, 135 may optionally be arranged to gather one or more data points associated with the metrics, to associate the gathered statistics with the person under test, and to store the associated statistics. This way, it is possible to gather base values of the metrics of a person under test when he or she is not under the influence of a drug. During a subsequent measurement, the control unit may be arranged to compare the new test data with the stored base values. Deviations between the new data and the base values can be an indication that the person is under the influence of a drug. The drug detection test may comprise a combination of comparing the test metrics to nominal values (i.e. predetermined threshold values) and comparing to the person's base values.

With reference to Figure 4, the eye-scanner device 120 discussed above is arranged to execute a method comprising:

S1: Capturing a visual representation of the eye 110. This representation may comprise one or more images, or a video sequence, or a combination of images and video sequence.

S2: Determining a pupil size metric 310 based on the captured visual representation.
The person may be instructed to attempt to cross eyes or beinstructed to follow a marker on a display 220 or be instructed to follow anexternal object such as a pen which is moved in a way to cause eye crossing.Ability to cross eyes may be determined based on image processingtechniques. S5: Determine a measurement of pupil's reaction to light metric, where the lighthas a given intensity. A light source 320 arranged on or in connection to thevision-based sensor 121 may be used to stimulate a reaction, which reactioncan then be monitored based on the representation. The disclosed method then comprises determining S6 a level of drug influenceassociated with the person based on the determined metrics, i.e., based onthe abilities of the eye or pair of eyes. The determining of drug influence maybe a yes/no result, or a level on a scale. The determining may also be basedon a sequence of tests, where drug influence is determined based on an abilityof the person to pass the sequence of tests, e.g., associated with S2-S5 above. The detection is based on four measurements S2, S3, S4, S5, giving a robustdetection method based on the four metrics able to detect drug influence in a person. Figure 5 schematically illustrates, in terms of a number of functional units, thecomponents of a control unit 125, 135 according to an embodiment of thediscussions herein. The control unit may be comprised in a mobile device suchas a smart phone 210 or in an external processing resource 130. Processingcircuitry 510 is provided using any combination of one or more of a suitablecentral processing unit CPU, multiprocessor, microcontroller, digital signal 17 processor DSP, etc., capable of executing software instructions stored in acomputer program product, e.g. in the form of a storage medium 530. Theprocessing circuitry 510 may further be provided as at least one applicationspecific integrated circuit ASIC, or field programmable gate array FPGA. Particularly, the processing circuitry 510 is configured to cause the control unit125, 135 to perform a set of operations, or steps, such as the methodsdiscussed in connection to Figure 4. For example, the storage medium 530may store the set of operations, and the processing circuitry 510 may beconfigured to retrieve the set of operations from the storage medium 530 tocause the control unit 125, 135 to perform the set of operations. The set ofoperations may be provided as a set of executable instructions. Thus, theprocessing circuitry 510 is thereby arranged to execute methods as herein disclosed. The storage medium 530 may also comprise persistent storage, which, forexample, can be any single one or combination of magnetic memory, opticalmemory, solid state memory, memory card or even remotely mounted memory. The control unit 125, 135 may further comprise an interface 520 forcommunications with at least one external device 540. As such the interface520 may comprise one or more transmitters and receivers, comprisinganalogue and digital components and a suitable number of ports for wireline or wireless communication. The processing circuitry 510 controls the general operation of the control unit125, 135 e.g. by sending data and control signals to the interface 520 and thestorage medium 530, by receiving data and reports from the interface 520, andby retrieving data and instructions from the storage medium 530. Othercomponents, as well as the related functionality, of the control node are omittedin order not to obscure the concepts presented herein. 
Figure 6 schematically illustrates a computer program product 600, comprising a set of operations 610 executable by the control unit 125, 135. The set of operations 610 may be loaded into the storage medium 530. The set of operations may correspond to the methods discussed above in connection to Figure 4 or to the different operations discussed above.

In the example of Figure 6, the computer program product 600 is illustrated as an optical disc, such as a CD (compact disc) or a DVD (digital versatile disc) or a Blu-Ray disc. The computer program product could also be embodied as a memory, such as a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM), and more particularly as a non-volatile storage medium of a device in an external memory such as a USB (Universal Serial Bus) memory or a Flash memory, such as a compact Flash memory. Thus, while the computer program is here schematically shown as a track on the depicted optical disk, the computer program can be stored in any way which is suitable for the computer program product.

Below follows an example scenario demonstrating the eye-scanner device and the disclosed method. The person under test has access to the eye-scanner device, which in this case comprises a smartphone. First, an eye-scanner application is started, which comprises five subsequent stages: START, which is the first interface that introduces the person under test to the application wherein a test can be initiated; PREAMBLE, which is an interface presenting the stages and instructions, and gives the possibility to terminate; TEST DATA, which is an interface wherein information about the person under test can be inputted; MEASUREMENTS, which starts the measurements and displays a measurement interface; and RESULTS, which is an interface displaying results. Details about the five stages follow below; a minimal sketch of the stage flow is given at the end of this scenario.

START

The start interface is the first interface when the application is launched. It displays that the eye-scanner application has been launched and enables initialization of testing. After the subsequent stages have been cycled through, the application is returned to the start interface.

PREAMBLE
Optionally, the confirmation can only be initiatedwhen a minimum amount of information has been inputted, such as age and SGX.IVIEASUREIVIENTS The four tests may be carried out in a single sequence, or they may be initiatedindividually. When the measurements interface is initiated, the camera isturned on and a live feed of the camera is displayed. When the camera sensor121 is capable of capturing the visual representation of the eye 110, captureindicators in the form of circles displayed on top of the pupils appears on thelive feed, and remain there as long as the camera sensor 121 is capable ofcapturing the visual representation of the eye 110. This can increase theprobability of a successful capture during one of the four tests. The captureindicators can thereby guide the person under test to maintain capturingcapabilities. The measurements interface also comprise a distance indicator that indicate that the distance between the camera sensor 121 and the eye110 of the person under test is within a preferred range. This provides clearfeedback to the person under test which enables the preferred distance to be maintained. When the camera sensor can capture, and when the distance is in thepreferred range, the four tests may be carried out automatically or be initiatedby the person under test. An information view successively shows informationabout the ongoing test and, together with markers, guides the person undertest what to do. When the four tests have been successfully completed, the measurement are saved, and the next stage is initiated automatically. The measurements interface comprise three parts: an information view, whichshows information about the current test (out of four) and provides feedbackto the person under test; a film view, which shows a live feed of the camera; and means for termination. The information view provides information about the ongoing test. After one ofthe four tests has successfully been completed, the information view providesnew information about a new ongoing test. The information view comprises aheadline stating the ongoing test and an instruction to the person under test,e.g. “look into the camera”, “the light source 320 is about to be turned on, keeplooking straight ahead", and “without moving your head, start looking to theleft, continue moving your eyes until commanded to stop”. The informationview also comprises a capture indicator and a distance indicator. Theinformation view further comprises a counter that counts down the time to thestart of a test. The person under test thereby has time to move his eyes intoposition before the test starts. The information view also comprises informationabout what is currently being processed, e.g. “looking for the pupil”. Some orall of the information on the information view may also be communicated to the person through voice messages (which may be pre-recorded). The film view is a live feed of the camera. This provides feedback to the personunder test such that he can hold the phone properly. The capture indicators inthe form of circles displayed on top of the pupils reinforce this feedback. 21 The means for termination allows the person under test to abort the tests andreturn to the start interface. Optionally, it is also possible to pause the tests ifthey are done in a sequence. lt may also be possible to restart the tests, i.e. toreturn to the beginning of the measurement stage. RESULTS The results interface is initiated after the measurement stage is successfullycompleted. 
The analysis of the measurement metrics can take place on the processing circuitry of the smartphone or on an external processing resource 130, such as a remote server or cloud-based processing resource. It is also possible that some of the analysis is carried out on the processing circuitry of the smartphone and the rest on the external processing resource. In any case, the results interface shows that all tests have been completed and a summary of the results, which comprises age, sex, date, and a test code.

After the analysis of the measurement metrics takes place, a recommendation for further management may be provided on the results interface, e.g., a recommendation for additional drug tests such as blood or urine tests.

The test code is a unique code for a completed cycle of the stages. It can be used for connecting the four tests to the results of additional drug tests such as blood or urine tests.

The results interface also comprises means for termination, which returns the application to the start interface.
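The five interface stages of this example scenario can be summarised as a simple state machine. The sketch below uses the stage names from the text; the transition logic, and the rule that any means for termination returns to START, are assumptions made for illustration.

```python
# Illustrative state machine for the five application stages described above.
from enum import Enum, auto

class Stage(Enum):
    START = auto()
    PREAMBLE = auto()
    TEST_DATA = auto()
    MEASUREMENTS = auto()
    RESULTS = auto()

NEXT = {
    Stage.START: Stage.PREAMBLE,        # testing initiated
    Stage.PREAMBLE: Stage.TEST_DATA,    # person under test confirms
    Stage.TEST_DATA: Stage.MEASUREMENTS,
    Stage.MEASUREMENTS: Stage.RESULTS,  # all four tests completed
    Stage.RESULTS: Stage.START,         # cycle finished, test code issued
}

def advance(stage, terminated=False):
    """Move to the next stage, or back to START if the means for termination is used."""
    return Stage.START if terminated else NEXT[stage]
```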
Claims:
Claims (21)

[1] 1. An eye-scanner device (120) for detecting drug influence in a person based on a visual representation of an eye (110) of the person, the device comprising a vision-based sensor (121), a light source (122) and a control unit (125, 135), wherein the vision-based sensor (121) is arranged to capture a visual representation of the eye (110), the control unit (125, 135) is arranged to determine a pupil size metric (310), a pupil's reaction to light metric (320), a horizontal and/or vertical nystagmus metric (330, 340), and an eye crossing ability metric (350), wherein the control unit (125, 135) is arranged to detect drug influence in the person based on the pupil size metric (310), the pupil's reaction to light metric (320), the horizontal and/or vertical nystagmus metric (330, 340), and the eye crossing ability metric (350).

[2] 2. The eye-scanner device (120) according to claim 1, wherein the vision-based sensor (121) comprises a camera sensor arranged on a smartphone (210), tablet, or a mobile computing device.

[3] 3. The eye-scanner device (120) according to any previous claim, wherein the light source (122) is comprised in a smartphone (210), tablet, or a mobile computing device.

[4] 4. The eye-scanner device (120) according to any previous claim, wherein the control unit (125) is arranged in physical connection to the vision-based sensor (121).

[5] 5. The eye-scanner device (120) according to any previous claim, wherein the control unit (125) is arranged distanced from the vision-based sensor in an external processing resource (130).

[6] 6. The eye-scanner device (120) according to any previous claim, wherein drug influence in the person is detected if one or more of the metrics fail to meet a respective metric pass criterion.

[7] 7. The eye-scanner device (120) according to any previous claim, wherein drug influence in the person is detected based on a weighted combination of the metrics.

[8] 8. The eye-scanner device (120) according to any previous claim, wherein the vision-based sensor (121) is arranged to capture a time sequence of images of the eye (110).

[9] 9. The eye-scanner device (120) according to any previous claim, wherein the vision-based sensor (121) is arranged to capture a video sequence of the eye (110).

[10] 10. The eye-scanner device (120) according to any previous claim, wherein the control unit (125, 135) is arranged to gather one or more statistics associated with the metrics, to anonymize the gathered statistics, and to store the anonymized statistics.

[11] 11. The eye-scanner device (120) according to any previous claim, comprising a display (220) with a visible marker.

[12] 12. The eye-scanner device (120) according to any previous claim, comprising a distance sensor (126) arranged to determine the distance between the vision-based sensor (121) and the eye (110) of the person.

[13] 13. The eye-scanner device (120) according to claim 12, wherein the distance sensor (126) comprises a camera sensor arranged on a smartphone (210), tablet, or a mobile computing device.

[14] 14. The eye-scanner device (120) according to any of claims 12-13, wherein the distance sensor (126) is arranged to be calibrated using a known reference in proximity to the eye (110).

[15] 15. The eye-scanner device (120) according to any of claims 12-14, wherein the distance sensor (126) comprises a plurality of camera sensors.
[16] 16. The eye-scanner device (120) according to any of claims 12-15, wherein the distance sensor (126) comprises a radar, a light detection and ranging (LIDAR) device, and/or any other light-based ranging device, such as depth sensing using infrared projection.

[17] 17. The eye-scanner device (120) according to any previous claim, wherein the display comprises one or more test indicators arranged to indicate that a capture of a visual representation of the eye (110) is ongoing.

[18] 18. The eye-scanner device (120) according to any previous claim, wherein the display comprises one or more distance indicators arranged to indicate that the distance between the vision-based sensor (121) and the eye (110) of the person is within a preferred range.

[19] 19. The eye-scanner device (120) according to any previous claim, wherein the display comprises one or more capture indicators arranged to indicate that the vision-based sensor (121) is capable of capturing the visual representation of the eye (110).

[20] 20. The eye-scanner device (120) according to any previous claim, comprising a speaker device (127) arranged to communicate instructions to the person.

[21] 21. The eye-scanner device (120) according to any previous claim, wherein the pupil size metric (310), the pupil's reaction to light metric (320), the horizontal and/or vertical nystagmus metric (330, 340), and the eye crossing ability metric (350) are, at least partly, determined based on relative measurements.
|